NumPy, Pandas & Web Scraping in Python – Simple Beginner Notes 📊🐍

In this lesson we learn NumPy, Pandas, and Web Scraping in simple English with examples.


🔹 1. NumPy Arrays

NumPy is used for fast numerical calculations using arrays.

import numpy as np

array_1 = np.array([1,2,3])
array_2 = np.array([4,5,6])

print(array_1 + array_2)

✔ Output: [5 7 9]
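Addition is not special here — subtraction, multiplication, and even operations with a single number are all element-wise too. A small sketch:

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# All arithmetic on arrays is element-wise
print(a - b)   # [-3 -3 -3]
print(a * b)   # [ 4 10 18]

# A scalar is "broadcast" to every element
print(a * 2)   # [2 4 6]
```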

Useful functions:

array = np.array([16,36,9])
print(np.sqrt(array))   # Square root
print(np.mean(array))   # Average

🔹 2D Arrays

array = np.array([[4,3,5],[9,5,8]])

print(array.shape)  # (2,3)
print(array.ndim)   # 2 dimensions
print(array.size)   # total elements
print(array.dtype)  # data type

✔ NumPy uses zero-based indexing, just like Python lists.

print(array[1,1])   # row 1, column 1 → 5
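Aggregate functions such as sum() and mean() also accept an axis argument on 2D arrays: axis=0 works down the columns, axis=1 across the rows. A quick sketch using the same array:

```python
import numpy as np

array = np.array([[4, 3, 5], [9, 5, 8]])

print(array.sum())        # 34 – every element
print(array.sum(axis=0))  # [13  8 13] – column sums
print(array.sum(axis=1))  # [12 22] – row sums
```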

🔹 Filtering in NumPy

Use Boolean Indexing to filter values.

arr = np.array([10,50,40,20,30])
filtered = arr[arr > 25]
print(filtered)
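Conditions can also be combined. A minimal sketch — note that each condition needs its own parentheses, and NumPy uses & and | instead of and / or:

```python
import numpy as np

arr = np.array([10, 50, 40, 20, 30])

# & means "and", | means "or"; parentheses around each condition are required
print(arr[(arr > 15) & (arr < 45)])   # [40 20 30]
```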

🔹 2. Pandas DataFrame

A DataFrame is like an Excel table.

import pandas as pd

data = {
    'Name':['Alice','Bob'],
    'Age':[25,30]
}

df = pd.DataFrame(data)
print(df)

Access data:

print(df.loc[0,'Age'])   # label based
print(df.iloc[1,0])      # index based
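You can also grab a whole row or a whole column at once, not just a single cell. A small sketch with the same DataFrame:

```python
import pandas as pd

df = pd.DataFrame({'Name': ['Alice', 'Bob'], 'Age': [25, 30]})

print(df.loc[0])     # the whole first row
print(df['Name'])    # the whole 'Name' column
```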

Useful properties:

print(df.shape)
print(df.size)
print(len(df))
print(df.columns)

🔹 Add New Column

df['City'] = ['NY','LA']
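A new column can also be computed from an existing one. The `AgeNextYear` name below is just an illustration:

```python
import pandas as pd

df = pd.DataFrame({'Name': ['Alice', 'Bob'], 'Age': [25, 30]})
df['City'] = ['NY', 'LA']

# A derived column: its values come from another column
df['AgeNextYear'] = df['Age'] + 1
print(df)
```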

🔹 Filtering

filtered_df = df[df['Age'] > 25]
print(filtered_df)

🔹 Aggregate Functions

print(df['Age'].sum())
print(df['Age'].mean())

🔹 Read CSV

df = pd.read_csv('file.csv')
print(df)
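The reverse also works: to_csv() writes a DataFrame to a file ('output.csv' below is just an example filename):

```python
import pandas as pd

df = pd.DataFrame({'Name': ['Alice', 'Bob'], 'Age': [25, 30]})

# index=False skips writing the row numbers into the file
df.to_csv('output.csv', index=False)

df2 = pd.read_csv('output.csv')
print(df2)
```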

🔹 3. Web Scraping

Web Scraping means extracting data from websites.

🔹 GET Request

import requests

response = requests.get("https://api.github.com/events")
print(response.status_code)
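Before using a response, it is worth checking that the request actually succeeded. A small sketch — status code 200 means success, and .json() parses the JSON body into Python objects:

```python
import requests

response = requests.get("https://api.github.com/events")

# 200 means success; .json() turns the JSON body into lists/dicts
if response.status_code == 200:
    events = response.json()
    print(len(events))
```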

🔹 POST Request

data = {'username':'user','password':'pass'}
response = requests.post("https://httpbin.org/post", data=data)

🔹 URL Parameters

Pass query parameters as a dictionary and requests builds the URL for you:

params = {'userId': 1}
response = requests.get("https://jsonplaceholder.typicode.com/posts", params=params)
# Requested URL: https://jsonplaceholder.typicode.com/posts?userId=1

🔹 4. BeautifulSoup

Used to extract data from HTML.

from bs4 import BeautifulSoup

html = "<h1>Hello</h1>"
soup = BeautifulSoup(html, 'html.parser')
print(soup.find('h1').text)   # Hello
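find() returns only the first match; find_all() returns every match. A small sketch with a made-up HTML list:

```python
from bs4 import BeautifulSoup

html = "<ul><li>One</li><li>Two</li></ul>"
soup = BeautifulSoup(html, 'html.parser')

# find_all() returns a list of every matching tag
for li in soup.find_all('li'):
    print(li.text)
```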

🔹 5. API Example (Weather)

def fetch_weather(latitude, longitude):
    url = "https://api.open-meteo.com/v1/forecast"
    params = {"latitude":latitude,"longitude":longitude,"current":"temperature_2m"}

    res = requests.get(url, params=params)
    return res.json()

weather = fetch_weather(6.7,79.9)
print(weather)

🔹 6. Debugging

Use the pdb module or the built-in breakpoint() function to pause a program and inspect it line by line.

import pdb
pdb.set_trace()

# OR
breakpoint()

🎯 These tools are very important for Data Science, Machine Learning, and Web Development.

